An efficient constrained training algorithm for feedforward networks
Internal identifier: 002C24 (Main/Exploration); previous: 002C23; next: 002C25
Authors: D. A. Karras [Greece]; S. J. Perantonis
Source:
- IEEE transactions on neural networks [ 1045-9227 ] ; 1995.
French descriptors
- Pascal (Inist)
English descriptors
- KwdEn:
Abstract
A novel algorithm is presented which supplements the training phase in feedforward networks with various forms of information about desired learning properties. This information is represented by conditions which must be satisfied in addition to the demand for minimization of the usual mean square error cost function. The purpose of these conditions is to improve convergence, learning speed, and generalization properties through prompt activation of the hidden units, optimal alignment of successive weight vector offsets, elimination of excessive hidden nodes, and regulation of the magnitude of search steps in the weight space. The algorithm is applied to several small- and large-scale binary benchmark training tasks, to test its convergence ability and learning speed, as well as to a large-scale OCR problem, to test its generalization capability. Its performance in terms of percentage of local minima, learning speed, and generalization ability is evaluated and found superior to the performance of the backpropagation algorithm and variants thereof, especially when the statistical significance of the results is taken into account.
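The abstract's core idea, minimizing the usual mean square error while also satisfying extra conditions on the weights, can be sketched generically as gradient descent on an augmented cost function. The sketch below is illustrative only: it is not the algorithm of Karras and Perantonis. It uses plain L2 weight decay as a hypothetical stand-in for the paper's constraints, and all hyperparameter values are assumptions.

```python
# Minimal sketch: train a tiny 2-2-1 feedforward net on XOR (a classic
# binary benchmark) by gradient descent on MSE plus a penalty term.
# The L2 penalty here is a hypothetical stand-in for the additional
# conditions described in the abstract; it is NOT the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR task: four binary input patterns and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)

lam = 1e-3  # penalty weight (hypothetical value)
lr = 0.5    # learning rate (hypothetical value)

def forward(X):
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output
    return h, out

_, out0 = forward(X)
initial_mse = float(np.mean((out0 - y) ** 2))

for _ in range(5000):
    h, out = forward(X)
    # Backpropagated MSE gradients (sigmoid derivative = s * (1 - s)).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Each update combines the MSE gradient with the penalty gradient.
    W2 -= lr * (h.T @ d_out / len(X) + lam * W2)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h / len(X) + lam * W1)
    b1 -= lr * d_h.mean(axis=0)

_, out = forward(X)
final_mse = float(np.mean((out - y) ** 2))
print(f"MSE: {initial_mse:.3f} -> {final_mse:.3f}")
```

The point of the sketch is only the shape of the update: every weight step blends the error gradient with the gradient of an auxiliary condition, which is the general mechanism the abstract describes for steering convergence and generalization.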
Affiliations:
Links toward previous steps (curation, corpus...)
- to stream PascalFrancis, to step Corpus: 000A35
- to stream PascalFrancis, to step Curation: 000964
- to stream PascalFrancis, to step Checkpoint: 000A35
- to stream Main, to step Merge: 002D88
- to stream Main, to step Curation: 002C24
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">An efficient constrained training algorithm for feedforward networks</title>
<author><name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
<affiliation wicri:level="1"><inist:fA14 i1="01"><s1>National res. cent. "Demokritos", inst. informatics telecommunications</s1>
<s2>Athens</s2>
<s3>GRC</s3>
</inist:fA14>
<country>Grèce</country>
<wicri:noRegion>Athens</wicri:noRegion>
</affiliation>
</author>
<author><name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">96-0015272</idno>
<date when="1995">1995</date>
<idno type="stanalyst">PASCAL 96-0015272 INIST</idno>
<idno type="RBID">Pascal:96-0015272</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000A35</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000964</idno>
<idno type="wicri:Area/PascalFrancis/Checkpoint">000A35</idno>
<idno type="wicri:doubleKey">1045-9227:1995:Karras D:an:efficient:constrained</idno>
<idno type="wicri:Area/Main/Merge">002D88</idno>
<idno type="wicri:Area/Main/Curation">002C24</idno>
<idno type="wicri:Area/Main/Exploration">002C24</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">An efficient constrained training algorithm for feedforward networks</title>
<author><name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
<affiliation wicri:level="1"><inist:fA14 i1="01"><s1>National res. cent. "Demokritos", inst. informatics telecommunications</s1>
<s2>Athens</s2>
<s3>GRC</s3>
</inist:fA14>
<country>Grèce</country>
<wicri:noRegion>Athens</wicri:noRegion>
</affiliation>
</author>
<author><name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</author>
</analytic>
<series><title level="j" type="main">IEEE transactions on neural networks</title>
<title level="j" type="abbreviated">IEEE trans. neural netw.</title>
<idno type="ISSN">1045-9227</idno>
<imprint><date when="1995">1995</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><title level="j" type="main">IEEE transactions on neural networks</title>
<title level="j" type="abbreviated">IEEE trans. neural netw.</title>
<idno type="ISSN">1045-9227</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Algorithm</term>
<term>Learning</term>
<term>Neural network</term>
<term>Optimization</term>
<term>Performance</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Réseau neuronal</term>
<term>Apprentissage</term>
<term>Optimisation</term>
<term>Algorithme</term>
<term>Performance</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">A novel algorithm is presented which supplements the training phase in feedforward networks with various forms of information about desired learning properties. This information is represented by conditions which must be satisfied in addition to the demand for minimization of the usual mean square error cost function. The purpose of these conditions is to improve convergence, learning speed, and generalization properties through prompt activation of the hidden units, optimal alignment of successive weight vector offsets, elimination of excessive hidden nodes, and regulation of the magnitude of search steps in the weight space. The algorithm is applied to several small- and large-scale binary benchmark training tasks, to test its convergence ability and learning speed, as well as to a large-scale OCR problem, to test its generalization capability. Its performance in terms of percentage of local minima, learning speed, and generalization ability is evaluated and found superior to the performance of the backpropagation algorithm and variants thereof taking especially into account the statistical significance of the results.</div>
</front>
</TEI>
<affiliations><list><country><li>Grèce</li>
</country>
</list>
<tree><noCountry><name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</noCountry>
<country name="Grèce"><noRegion><name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/OcrV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002C24 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 002C24 | SxmlIndent | more
To link to this page within the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= OcrV1 |flux= Main |étape= Exploration |type= RBID |clé= Pascal:96-0015272 |texte= An efficient constrained training algorithm for feedforward networks }}
This area was generated with Dilib version V0.6.32.